Differentiable programming of isometric tensor networks

Authors

Abstract

Differentiable programming is a new programming paradigm which enables large-scale optimization through the automatic calculation of gradients, also known as auto-differentiation. This concept emerged from deep learning and has been generalized to tensor network optimization. Here, we extend differentiable programming to tensor networks with isometric constraints, with applications to the multiscale entanglement renormalization ansatz (MERA) and tensor network renormalization (TNR). By introducing several gradient-based optimization methods for the isometric tensor network and comparing them with the Evenbly–Vidal method, we show that auto-differentiation has better performance in both stability and accuracy. We numerically test our methods on the 1D critical quantum Ising spin chain and the 2D classical Ising model. We calculate the ground-state energy of the quantum model, the internal energy of the classical model, and the scaling dimensions of scaling operators, and find that they all agree well with theory.
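The isometric constraint (WᵀW = I) is what distinguishes this setting from ordinary gradient descent. Below is a minimal sketch of the idea in PyTorch, assuming a toy quadratic cost in place of a MERA energy and a QR-based retraction onto the Stiefel manifold; the paper compares several such gradient-based updates, and none of the names below come from its code.

import torch

def qr_retract(M):
    # Map an arbitrary tall matrix back onto the Stiefel manifold (W^T W = I)
    # by keeping the Q factor of a QR decomposition.
    Q, R = torch.linalg.qr(M)
    # Fix the column-sign ambiguity of QR so the retraction is continuous.
    return Q * torch.sign(torch.diagonal(R))

torch.manual_seed(0)
H = torch.randn(8, 8)
H = H + H.T                          # toy symmetric "Hamiltonian" standing in for a MERA cost

W = qr_retract(torch.randn(8, 4))    # random initial isometry mapping 8 -> 4 dimensions
W.requires_grad_(True)

lr = 0.05
for step in range(200):
    loss = torch.trace(W.T @ H @ W)  # toy cost: energy in the 4-dimensional subspace
    grad, = torch.autograd.grad(loss, W)
    with torch.no_grad():
        # Project the Euclidean gradient onto the tangent space of the
        # Stiefel manifold, take a step, then retract back onto the manifold.
        rgrad = grad - W @ (W.T @ grad + grad.T @ W) / 2
        W = qr_retract(W - lr * rgrad)
    W.requires_grad_(True)

The auto-differentiation step (torch.autograd.grad) replaces the hand-derived environment tensors of the Evenbly–Vidal update; only the projection and retraction need to know about the constraint.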


Similar resources

Isometric Differentiable Functions on Real Normed Space

From now on S, T, W, Y denote real normed spaces, f, f₁, f₂ denote partial functions from S to T, Z denotes a subset of S, and i, n denote natural numbers. Now we state the propositions: (1) Let us consider a set X and functions I, f. Then (f↾X) · I = (f · I)↾I⁻¹(X). (2) Let us consider real normed spaces S, T, a linear operator L from S into T, and points x, y of S. Then L(x) − L(y) = L(x...


Differentiable Genetic Programming

We introduce the use of high order automatic differentiation, implemented via the algebra of truncated Taylor polynomials, in genetic programming. Using the Cartesian Genetic Programming encoding we obtain a high-order Taylor representation of the program output that is then used to back-propagate errors during learning. The resulting machine learning framework is called differentiable Cartesia...
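The algebra of truncated Taylor polynomials is easy to sketch: a value carries its higher derivative coefficients, and arithmetic propagates them. A minimal illustration (mine, not the paper's dCGP implementation), truncating at order 4:

import numpy as np

ORDER = 4  # truncate after the fourth-order term

def taylor(value, is_variable=False):
    # Coefficient c[k] stores the k-th derivative divided by k!.
    c = np.zeros(ORDER + 1)
    c[0] = value
    if is_variable:
        c[1] = 1.0  # seed dx/dx = 1 for the expansion variable
    return c

def tmul(a, b):
    # Product of two truncated Taylor polynomials (a Cauchy convolution).
    out = np.zeros(ORDER + 1)
    for k in range(ORDER + 1):
        out[k] = sum(a[i] * b[k - i] for i in range(k + 1))
    return out

x = taylor(2.0, is_variable=True)
f = tmul(tmul(x, x), x)   # f(x) = x^3 evaluated at x = 2
# -> [8, 12, 6, 1, 0]: f(2) = 8, f'(2) = 12, f''(2)/2! = 6, f'''(2)/3! = 1
print(f)

Running every node of the encoded program on such polynomials yields all derivatives of the output in one pass, which is what makes error back-propagation through the program possible.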


Programming with a Differentiable Forth Interpreter

There are families of neural networks that can learn to compute any function, provided sufficient training data. However, given that in practice training data is scarce for all but a small set of problems, a core question is how to incorporate prior knowledge into a model. Here we consider the case of prior procedural knowledge, such as knowing the overall recursive structure of a sequence tran...


A differentiable approach to inductive logic programming

Recent work in neural abstract machines has proposed many useful techniques to learn sequences of applications of discrete but differentiable operators. These techniques allow us to model traditionally procedural problems using neural networks. In this work, we are interested in using neural networks to learn to perform logic reasoning. We propose a model that has access to differentiable opera...
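One common way to make discrete logic operators differentiable (a generic illustration, not necessarily this paper's exact construction) is to relax truth values to the interval [0, 1] and replace Boolean connectives with smooth surrogates:

def soft_and(p, q):
    return p * q              # product t-norm

def soft_or(p, q):
    return p + q - p * q      # probabilistic sum, the dual co-norm

def soft_not(p):
    return 1.0 - p

# Truth values relaxed to [0, 1]; at exactly 0/1 these reduce to Boolean
# logic, and in between they are smooth, so gradients can flow through
# a sequence of reasoning steps.
p, q = 0.9, 0.2
print(soft_and(p, q), soft_or(p, q), soft_not(p))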


Neural networks with differentiable structure

While gradient descent has proven highly successful in learning connection weights for neural networks, the actual structure of these networks is usually determined by hand, or by other optimization algorithms. Here we describe a simple method to make network structure differentiable, and therefore accessible to gradient descent. We test this method on recurrent neural networks applied to simpl...
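A generic way to expose structure to gradient descent (an assumption about the flavor of method, not the paper's exact construction) is to attach a learnable gate to every candidate connection, so that "is this edge present?" becomes a continuous quantity:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # ordinary connection weights
G = rng.normal(size=(4, 3))   # structural gate logits, also trained by gradient descent

def layer(x):
    # Effective weight = weight * soft edge presence: driving a gate logit
    # to -inf prunes the connection, driving it to +inf keeps it.
    return x @ (W * sigmoid(G))

print(layer(rng.normal(size=(2, 4))))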



Journal

Journal title: Machine Learning: Science and Technology

Year: 2022

ISSN: 2632-2153

DOI: https://doi.org/10.1088/2632-2153/ac48a2